[neg'entrəpi]
General vocabulary: негэнтропия (negentropy)
See also
Noun (programming): негэнтропия (negentropy); отрицательная энтропия (negative entropy)
In information theory and statistics, negentropy is used as a measure of distance to normality. The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book What is Life? Later, Léon Brillouin shortened the phrase to negentropy. In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but negentropy remains common.
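As a brief sketch of the information-theoretic sense mentioned above (this is the standard textbook definition, not something stated in the dictionary entry itself): for a random variable X with density p_X, negentropy is usually written in LaTeX notation as

J(X) = S(X_{\mathrm{G}}) - S(X), \qquad S(X) = -\int p_X(x)\,\log p_X(x)\,dx,

where X_{\mathrm{G}} is a Gaussian random variable with the same mean and covariance as X. Under this definition J(X) is non-negative and vanishes exactly when X itself is Gaussian, which is why negentropy serves as a measure of distance to normality.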
In a note to What is Life?, Schrödinger explained his use of this phrase.
... if I had been catering for them [physicists] alone I should have let the discussion turn on free energy instead. It is the more familiar notion in this context. But this highly technical term seemed linguistically too near to energy for making the average reader alive to the contrast between the two things.